Variational Inference For Probabilistic Latent Tensor Factorization with KL Divergence

Authors

  • Beyza Ermis
  • Yusuf Kenan Yilmaz
  • Ali Taylan Cemgil
  • Evrim Acar
Abstract

Probabilistic Latent Tensor Factorization (PLTF) is a recently proposed probabilistic framework for modelling multi-way data. Not only the common tensor factorization models but also any arbitrary tensor factorization structure can be realized by the PLTF framework. This paper presents full Bayesian inference via variational Bayes that facilitates more powerful modelling and allows more sophisticated inference on the PLTF framework. We illustrate our approach on model order selection and link prediction.

Keywords: Probabilistic Latent Tensor Factorization (PLTF); Variational Bayes (VB); Link Prediction; missing data
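
As a minimal illustration of the cost function named in the title (a sketch, not the paper's implementation), the generalized KL divergence between observed counts `V` and a model intensity `Lam` can be computed as follows; it matches the Poisson log-likelihood up to a constant, which is why it is the natural divergence for count-valued multi-way data:

```python
import numpy as np

def kl_divergence(V, Lam, eps=1e-12):
    """Generalized KL divergence D(V || Lam) for nonnegative arrays.

    Equals the negative Poisson log-likelihood of V under intensity Lam,
    up to an additive constant independent of Lam. Works for matrices
    or tensors of any order (elementwise sum).
    """
    V = np.asarray(V, dtype=float)
    Lam = np.asarray(Lam, dtype=float)
    return float(np.sum(V * np.log((V + eps) / (Lam + eps)) - V + Lam))
```

The divergence is zero exactly when the model intensity matches the data, and strictly positive otherwise.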

Related Articles

Fast Parallel Randomized Algorithm for Nonnegative Matrix Factorization with KL Divergence for Large Sparse Datasets

Nonnegative Matrix Factorization (NMF) with Kullback-Leibler Divergence (NMF-KL) is one of the most significant NMF problems and is equivalent to Probabilistic Latent Semantic Indexing (PLSI), which has been successfully applied in many applications. For sparse count data, a Poisson distribution and KL divergence provide sparse models and sparse representations, which describe the random variation ...
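
The NMF-KL objective mentioned above can be minimized with the classic multiplicative updates; the sketch below (a standard Lee–Seung-style implementation, not this paper's randomized parallel algorithm) shows the basic iteration for `D(V || W @ H)`:

```python
import numpy as np

def nmf_kl(V, rank, n_iter=300, seed=0, eps=1e-9):
    """NMF via multiplicative updates minimizing generalized KL D(V || W @ H)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        # Update W: elementwise ratio V / (WH), weighted by H, normalized.
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1) + eps)
        # Update H with the refreshed reconstruction.
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
    return W, H
```

Each update is guaranteed not to increase the KL cost, which is why these iterations are a common baseline before moving to randomized or parallel variants for large sparse data.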

Probabilistic Latent Tensor Factorization

We develop a probabilistic framework for multiway analysis of high dimensional datasets. By exploiting a link between graphical models and tensor factorization models we can realize any arbitrary tensor factorization structure, and many popular models such as CP or TUCKER models with Euclidean error and their non-negative variants with KL error appear as special cases. Due to the duality betwee...

Algorithms for probabilistic latent tensor factorization

We propose a general probabilistic framework for modelling multiway data. Our approach establishes a novel link between graphical representation of probability measures and tensor factorization models that allow us to design arbitrary tensor factorization models while retaining simplicity. Using an expectation-maximization (EM) approach for maximizing the likelihood of the exponential dispersio...

A Bayesian Tensor Factorization Model via Variational Inference for Link Prediction

Probabilistic approaches for tensor factorization aim to extract meaningful structure from incomplete data by postulating low rank constraints. Recently, variational Bayesian (VB) inference techniques have successfully been applied to large scale models. This paper presents full Bayesian inference via VB on both single and coupled tensor factorization models. Our method can be run even for very...

Learning to Draw Samples with Amortized Stein Variational Gradient Descent

We propose a simple algorithm to train stochastic neural networks to draw samples from given target distributions for probabilistic inference. Our method is based on iteratively adjusting the neural network parameters so that the output changes along a Stein variational gradient direction (Liu & Wang, 2016) that maximally decreases the KL divergence with the target distribution. Our method work...
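
The Stein variational gradient direction referenced here (Liu & Wang, 2016) can be sketched for one-dimensional particles with an RBF kernel and the median bandwidth heuristic; this is a plain (non-amortized) SVGD step, not the amortized neural-network training loop the abstract describes:

```python
import numpy as np

def svgd_update(x, score, step=0.05):
    """One SVGD step for 1-D particles x, given the target's score function.

    phi_i = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + d/dx_j k(x_j, x_i) ]
    The first term pulls particles toward high density; the second term
    repels them from each other, preventing mode collapse.
    """
    n = x.shape[0]
    diff = x[:, None] - x[None, :]          # diff[i, j] = x_i - x_j
    sq = diff ** 2
    h = np.median(sq) / np.log(n + 1) + 1e-8  # median bandwidth heuristic
    K = np.exp(-sq / h)                      # RBF kernel matrix (symmetric)
    grad_K = (-2.0 / h) * diff * K           # grad_K[j, i] = d/dx_j k(x_j, x_i)
    phi = (K @ score(x) + grad_K.sum(axis=0)) / n
    return x + step * phi
```

For a standard normal target the score is simply `-x`; iterating the update transports an arbitrary initial particle cloud toward samples from N(0, 1).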

Journal:
  • CoRR

Volume: abs/1409.8083

Pages: -

Published: 2014